Power Gain Pattern Synthesis via Successive Convex Approximation Technique
Authors
Abstract
Related papers
Antenna array pattern synthesis via convex optimization
We show that a variety of antenna array pattern synthesis problems can be expressed as convex optimization problems, which can be (numerically) solved with great efficiency by recently developed interior-point methods. The synthesis problems involve arrays with arbitrary geometry and element directivity, constraints on far- and near-field patterns over narrow or broad frequency bandwidth, and some i...
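As a toy illustration of this formulation (not taken from the paper), the sketch below casts sidelobe minimization for a uniform linear array as a convex program in CVXPY; the array size, spacing, look direction, and sidelobe region are illustrative assumptions.

```python
# A minimal convex pattern-synthesis sketch (illustrative assumptions).
import numpy as np
import cvxpy as cp

N, d = 16, 0.5                          # elements and spacing in wavelengths (assumed)
theta = np.linspace(-np.pi / 2, np.pi / 2, 361)
A = np.exp(2j * np.pi * d * np.outer(np.sin(theta), np.arange(N)))  # steering matrix

main = np.abs(theta) < 1e-9             # broadside look direction
side = np.abs(theta) > np.deg2rad(10)   # sidelobe region (10-degree guard, assumed)

w = cp.Variable(N, complex=True)
# Fix the mainlobe response to 1 (linear constraint) and minimize the
# worst sidelobe magnitude (convex max of moduli of affine expressions).
prob = cp.Problem(cp.Minimize(cp.max(cp.abs(A[side] @ w))),
                  [A[main] @ w == 1])
prob.solve()
print("peak sidelobe level: %.1f dB" % (20 * np.log10(prob.value)))
```

Fixing the mainlobe response to 1 keeps the constraint linear, so the whole problem is a second-order cone program that interior-point solvers handle directly.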
Stochastic Successive Convex Approximation for Non-Convex Constrained Stochastic Optimization
This paper proposes a constrained stochastic successive convex approximation (CSSCA) algorithm to find a stationary point for a general non-convex stochastic optimization problem, whose objective and constraint functions are nonconvex and involve expectations over random states. The existing methods for non-convex stochastic optimization, such as the stochastic (average) gradient and stochastic...
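For intuition, here is a minimal stochastic SCA loop (a textbook sketch, not the paper's CSSCA algorithm): each iteration draws a mini-batch, minimizes a convex proximal-linear surrogate of a nonconvex loss over a box constraint, and moves toward the surrogate minimizer with a diminishing step. The loss, batch size, and step rule are assumptions.

```python
# A minimal stochastic SCA sketch (textbook choices, not the paper's CSSCA).
import numpy as np

rng = np.random.default_rng(0)
A_, b_ = rng.normal(size=(200, 10)), rng.normal(size=200)

def stoch_grad(x, idx):
    # Stochastic gradient of a nonconvex loss f(x) = E[(tanh(a^T x) - b)^2 / 2]
    # evaluated on a random mini-batch.
    z = np.tanh(A_[idx] @ x)
    return A_[idx].T @ ((z - b_[idx]) * (1.0 - z**2)) / len(idx)

x, tau = np.zeros(10), 1.0
for t in range(1, 201):
    idx = rng.choice(200, size=20, replace=False)
    # Convex surrogate: linearize the loss at x and add (tau/2)*||x_hat - x||^2;
    # its minimizer over the box [-1, 1]^n is a clipped gradient step.
    x_hat = np.clip(x - stoch_grad(x, idx) / tau, -1.0, 1.0)
    gamma = 2.0 / (t + 2)              # diminishing step size
    x = x + gamma * (x_hat - x)        # move toward the surrogate minimizer
```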
A Power Efficient Gain Enhancing Technique for Current Mirror
This work introduces a new and simple method for adjusting the gain of a current mirror. The major advantage of the proposed architecture is that, unlike the conventional variable-gain current mirror, it does not require changing the biasing current to adjust the current gain. Therefore, the power dissipation remains constant across all gain settings. In addition, the proposed variable gain curr...
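For context, the standard textbook relations below (not from the paper) show why tuning gain through the bias current costs power in a conventional mirror: the dissipation scales with the bias current.

```latex
% Textbook relations for a conventional MOS current mirror (not from
% the paper): the gain follows the device aspect ratios, and the total
% dissipation scales with the input (bias) current, so adjusting gain
% through the bias necessarily changes the power budget.
\[
  A_I = \frac{I_{\mathrm{out}}}{I_{\mathrm{in}}} = \frac{(W/L)_2}{(W/L)_1},
  \qquad
  P \approx \left(I_{\mathrm{in}} + I_{\mathrm{out}}\right) V_{DD}
    = (1 + A_I)\, I_{\mathrm{in}}\, V_{DD}.
\]
```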
Parallel Successive Convex Approximation for Nonsmooth Nonconvex Optimization
Consider the problem of minimizing the sum of a smooth (possibly non-convex) and a convex (possibly nonsmooth) function involving a large number of variables. A popular approach to solve this problem is the block coordinate descent (BCD) method whereby at each iteration only one variable block is updated while the remaining variables are held fixed. With the recent advances in the developments ...
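A minimal BCD sketch for a least-squares objective (illustrative only; the paper's parallel SCA scheme generalizes this): each inner step solves the subproblem for one variable block exactly while the other blocks are held fixed.

```python
# A minimal block-coordinate-descent sketch on least squares (illustrative).
import numpy as np

rng = np.random.default_rng(1)
A_, b_ = rng.normal(size=(100, 12)), rng.normal(size=100)
blocks = np.array_split(np.arange(12), 4)   # four variable blocks (assumed)

x = np.zeros(12)
for sweep in range(50):
    for blk in blocks:
        # Update one block exactly while all other blocks stay fixed:
        # remove this block's contribution from the residual, then solve
        # the least-squares subproblem restricted to its columns.
        r = b_ - A_ @ x + A_[:, blk] @ x[blk]
        x[blk] = np.linalg.lstsq(A_[:, blk], r, rcond=None)[0]
```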
Stochastic Training of Neural Networks via Successive Convex Approximations
This paper proposes a new family of algorithms for training neural networks (NNs). These are based on recent developments in the field of non-convex optimization, going under the general name of successive convex approximation (SCA) techniques. The basic idea is to iteratively replace the original (non-convex, high-dimensional) learning problem with a sequence of (strongly convex) approximati...
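To make the idea concrete, the sketch below applies an SCA-style update to a tiny one-layer model (an illustration under assumed choices, not the paper's algorithm): at each step the nonconvex fit is replaced by a strongly convex surrogate obtained by linearizing the model and adding a ridge term.

```python
# A minimal SCA-style training sketch for a one-layer model (illustrative).
import numpy as np

rng = np.random.default_rng(2)
X, y = rng.normal(size=(300, 5)), rng.normal(size=300)

w, lam = np.zeros(5), 0.5
for t in range(1, 101):
    pred = np.tanh(X @ w)              # model output, nonconvex in w
    J = X * (1.0 - pred**2)[:, None]   # Jacobian of pred with respect to w
    # Strongly convex surrogate around w:
    #   ||pred + J d - y||^2 + lam * ||d||^2,  with d = w_hat - w,
    # minimized in closed form via a regularized normal equation.
    d = np.linalg.solve(J.T @ J + lam * np.eye(5), J.T @ (y - pred))
    w = w + (2.0 / (t + 2)) * d        # diminishing step toward the minimizer
```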
Journal
Journal title: IEEE Access
Year: 2020
ISSN: 2169-3536
DOI: 10.1109/access.2020.3029034